Information Rates of Nonparametric Gaussian Process Methods

Authors

  • Aad van der Vaart
  • Harry van Zanten
Abstract

We consider the quality of learning a response function by a nonparametric Bayesian approach using a Gaussian process (GP) prior on the response function. We upper bound the quadratic risk of the learning procedure, which in turn is an upper bound on the Kullback-Leibler information between the predictive and true data distribution. The upper bound is expressed in small ball probabilities and concentration measures of the GP prior. We illustrate the computation of the upper bound for the Matérn and squared exponential kernels. For these priors the risk, and hence the information criterion, tends to zero for all continuous response functions. However, the rate at which this happens depends on the combination of true response function and Gaussian prior, and is expressible in a certain concentration function. In particular, the results show that for good performance, the regularity of the GP prior should match the regularity of the unknown response function.
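The abstract's main point can be illustrated numerically: when the smoothness of the GP prior's kernel does not match the regularity of the true response function, the quadratic risk decays more slowly. Below is a minimal numpy sketch (not the authors' code; the target function, length scales, and noise level are illustrative assumptions) comparing posterior predictive means under a Matérn-3/2 kernel and a squared exponential kernel for a rough (Lipschitz but not differentiable) truth.

```python
import numpy as np

def sq_exp(x, y, ell=0.2):
    """Squared exponential kernel k(s,t) = exp(-(s-t)^2 / (2 ell^2))."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def matern32(x, y, ell=0.2):
    """Matérn kernel with smoothness parameter nu = 3/2."""
    r = np.abs(x[:, None] - y[None, :]) / ell
    return (1.0 + np.sqrt(3) * r) * np.exp(-np.sqrt(3) * r)

rng = np.random.default_rng(0)
n, sigma = 200, 0.1
x = np.sort(rng.uniform(0.0, 1.0, n))
f = lambda t: np.abs(t - 0.5)            # rough truth: Lipschitz, not C^1
y = f(x) + sigma * rng.standard_normal(n)
xt = np.linspace(0.0, 1.0, 400)

for name, k in [("Matern-3/2", matern32), ("Squared exponential", sq_exp)]:
    K = k(x, x) + sigma ** 2 * np.eye(n)      # noisy Gram matrix
    mean = k(xt, x) @ np.linalg.solve(K, y)   # GP posterior predictive mean
    risk = np.mean((mean - f(xt)) ** 2)       # empirical quadratic risk
    print(f"{name}: empirical quadratic risk ~ {risk:.5f}")
```

Rerunning with a smooth truth (e.g. a sine function) and with varying sample sizes shows the rate phenomenon the paper quantifies: the mismatched prior still yields a risk tending to zero, but at a slower rate.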


Similar Articles

Asymptotic Behaviors of the Lorenz Curve for Left Truncated and Dependent Data

The purpose of this paper is to provide some asymptotic results for nonparametric estimator of the Lorenz curve and Lorenz process for the case in which data are assumed to be strong mixing subject to random left truncation. First, we show that nonparametric estimator of the Lorenz curve is uniformly strongly consistent for the associated Lorenz curve. Also, a strong Gaussian approximation for ...


Nonparametric Bayesian Methods

Most of this book emphasizes frequentist methods, especially for nonparametric problems. However, there are Bayesian approaches to many nonparametric problems. In this chapter we present some of the most commonly used nonparametric Bayesian methods. These methods place priors on infinite dimensional spaces. The priors are based on certain stochastic processes called Dirichlet processes and Ga...


Adaptive Bayesian Procedures Using Random Series Priors

We consider a general class of prior distributions for nonparametric Bayesian estimation which uses finite random series with a random number of terms. A prior is constructed through distributions on the number of basis functions and the associated coefficients. We derive a general result on adaptive posterior contraction rates for all smoothness levels of the target function in the true model ...


Adaptive Bayesian procedures using random series prior

We consider a prior for nonparametric Bayesian estimation which uses finite random series with a random number of terms. The prior is constructed through distributions on the number of basis functions and the associated coefficients. We derive a general result on adaptive posterior convergence rates for all smoothness levels of the function in the true model by constructing an appropriate “siev...


Bayesian inference with rescaled Gaussian process priors

Abstract: We use rescaled Gaussian processes as prior models for functional parameters in nonparametric statistical models. We show how the rate of contraction of the posterior distributions depends on the scaling factor. In particular, we exhibit rescaled Gaussian process priors yielding posteriors that contract around the true parameter at optimal convergence rates. To derive our results we e...



Journal:
  • Journal of Machine Learning Research

Volume 12

Pages: –

Published: 2011